ORG response to Data: a new direction
We submitted our response to Government plans to deregulate data protection, also known as “Data: a new direction”. From the National Data Strategy, through the TIGRR report, to the Plan for Digital Regulation, we have long opposed this unprecedented assault on the digital rights of UK residents.
Data-driven technologies have shown the potential to harm and discriminate in employment, finance, healthcare, credit scoring, insurance, housing, and education.
The General Data Protection Regulation provides much-needed legal protection and remedies against the growing weaponisation of personal data. In its stead, the Government proposes a cynical corporate rulebook designed to “unleash the power of data across the economy”. Human rights and human dignity are, at best, never mentioned and, at worst, treated as an obstacle in the path of innovation.
We challenge this view in our response, exposing the impact of these proposals on our rights. In preparation, we organised a roundtable together with the Privacy and Consumer Advisory Group, and engaged with civil society organisations, the immigration sector, and health groups.
Our campaign to stop data discrimination doesn’t end here. We will continue to challenge and raise awareness about Government plans and their adverse impact on our rights. An official Government response to this consultation is expected in spring 2022.
Unleashing creepy and underhanded uses of data
Purpose limitation is a cornerstone of data protection. If you take a Covid-19 test, your swab (genetic data) can be used to determine whether you are infected, but your DNA samples cannot be sold to pharmaceutical companies for commercial purposes. If you go to a hospital, doctors will use your information to take care of you, rather than reporting you to immigration authorities.
In other words, purpose limitation is a fundamental safeguard. It ensures that your information is used in expected, reasonable ways that are clear from the outset. Government proposals would undermine this principle by:
- Reducing safeguards over the reuse of data for scientific research purposes. The Government also propose to expand the definition of “scientific research”, possibly beyond what is reasonable.
- Easing the reuse of data for purposes other than those for which they were collected, including by organisations other than the one that collected your data in the first place.
- Reducing safeguards around the reuse of data for incompatible purposes in the “substantial public interest”.
This would undermine trust in legitimate research activities, expose individuals to creepy uses of their data, and reduce safeguards around Government seizure and reuse of personal data.
Legalising harm and discrimination
Another important cornerstone of the UK GDPR is the principle of legality. An organisation may collect and use personal data to fulfil a legal duty — be it a contract, the exercise of official authority, or another legal obligation. Beyond these cases, organisations will usually be required to ask for your consent. This ensures that individuals have control over how their data is used.
Alternatively, organisations may also collect and use personal data for their “legitimate interest”, but they need to balance that interest against the risks for the individuals involved. To pass this “balancing test”, the use must be uncontroversial and lack the potential to harm or discriminate.
However, the UK Government are proposing to scrap the balancing test for a list of “Legitimate Interests”. Further, the list the Government is proposing is so vague that it is difficult to foresee a scenario where an organisation wouldn’t be able to claim a legitimate interest. For instance:
- Profiling is a commercial practice that consists of discriminating between individuals based on their behaviour or personal traits. The legitimate interest of “improving services for customers” would allow organisations to claim that they are “personalising customers’ experience”, even if this results in discriminatory or exclusionary practices.
- Automated and real-time surveillance at workplaces has proven to be harmful and distressing, as the UK Parliament pointed out in a recent report. The legitimate interest of “reporting criminal acts or other safeguarding concerns” would legitimise such practices in principle, regardless of the risks to workers’ mental health.
On top of that, the UK Government is proposing to scrap Data Protection Impact Assessments, a procedure meant to identify and mitigate risks of adverse consequences for individuals whose data is being used.
In short, Government proposals would relieve organisations of the duty to consider the harmful and discriminatory effects of their data practices, reducing accountability and enabling malpractice to fly under the radar.
Reducing transparency and disempowering individuals
Transparency means that individuals have the right to know who is collecting their data and why. This includes the right of access: to be told what data have been collected about them and how they are being used.
However, the UK Government is proposing to undermine individuals’ right of access to their own data, with a number of proposals that would:
- Impose a nominal fee on individuals exercising their right of access, and require them to disclose the reasons for making such requests.
- Empower organisations to reject requests to access the personal data they hold about someone, either because of the motives behind such requests or because of a cost-capping limit.
These proposals would depart from the principle, enshrined in the GDPR, that fundamental rights must be exercisable free of charge. They would have a chilling effect on individuals, whether because of the costs involved or the prospect of disclosing the reasons behind their requests. They would also allow arbitrary and unreasonable rejections of access requests.
Finally, individuals would be required to try to resolve their complaints with the offenders before lodging a complaint with the Information Commissioner’s Office. This would jeopardise individuals’ access to one of the most popular and accessible remedies available under the UK GDPR, and increase the ICO’s discretion over whether and how to handle complaints. Further, individuals will not always be in a position to know or identify offenders, and approaching abusers may be inadvisable.
Scrapping the right to human review of automated decisions
The transparency, accountability and fairness of automated systems depend on the organisations that implement them. Thus, the UK GDPR imposes a duty on organisations to review the fairness and legality of automated life-changing decisions taken about individuals. This is also known as the “right to human review”.
However, the Government are considering removing this right, shifting the burden from organisations to individuals. This is fundamentally unfair: individuals would be asked to actively monitor and scrutinise life-changing decisions taken about them by systems that are beyond their control or understanding.
Undermining accountability and enforcement
Organisations have an important role in making data protection work in practice and in giving material effect to the rights and freedoms of individuals. Data Protection Authorities are also indispensable for effectively protecting individuals’ rights and freedoms, as the use of data becomes increasingly pervasive and complex.
However, the Government are proposing to:
- Scrap the GDPR accountability framework, in favour of vaguely defined and easy-to-game Privacy Management Programmes.
- Water down the Information Commissioner’s Office’s enforcement powers, and bring the ICO under Government control.
Privacy Management Programmes would effectively allow organisations to demonstrate compliance with the law on their own terms, opening the floodgates to abuse and dodginess.
This would make it more difficult to audit organisations and hold them to account. Worse still, weakened independent oversight would mean that the ICO has to consider the impact on the economic interests of offenders before enforcing the law, putting profits and the greater good of “growth and innovation” before individuals’ rights.
Finally, the Government propose to give themselves the power to set the ICO’s annual strategic priorities, as well as the power to amend the Commissioner’s salary without Parliamentary scrutiny. This would undermine the ICO’s ability to investigate and enforce the law against Government wishes, and expose Commissioners to political retaliation.
Sector-specific issues: electronic communications, political parties’ use of data, digital trade
Some of the Government’s proposals also target specific uses of data.
In electronic communications, the Government praise behavioural advertising and propose to relax consent requirements for cookies and online tracking. However, both the US Federal Trade Commission and the European Parliament are considering an outright ban on behavioural advertising because of the harms it produces, its potential to discriminate, and the widespread illegal conduct of adtech companies. Even in the UK, the ICO released two reports, in 2019 and 2021, both of which confirmed the unsustainability of these practices.
Turning to the use of data for political campaigning, the Government conveniently forget the Cambridge Analytica scandal and propose to expand the ability to profile individuals and use their personal data for electoral purposes.
Finally, the Government propose to weaken the protections for personal data in international data transfers. The current regime is based on “adequacy decisions”, which are meant to ensure that data transferred overseas do not expose the individuals concerned to risks or reduced legal safeguards. However, the Government would turn adequacy assessments into an instrument to boost international trade, easing international transfers even when this strips individuals of their rights.